(The UNIFORM DISTRIBUTION): $f(x)=\begin{cases}\frac{1}{R-L} & L\le x\le R\\ 0 & \text{otherwise}\end{cases}$
(The NORMAL($\mu,\sigma^2$) DISTRIBUTION PDF): $f(x)=\frac{1}{\sigma\sqrt{2\pi}}\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$, where $\mu$ is the mean and $\sigma^2$ is the variance.
$\mu=0,\sigma=1$ gives the Standard Normal distribution.
$\int_{-\infty}^{\infty}\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)dx=\sqrt{2\pi}\,\sigma$
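As a quick numerical sanity check of this Gaussian integral, here is a sketch using only Python's standard library; the midpoint-rule integrator, the step count, and the parameters $\mu=1,\sigma=2$ are arbitrary illustrative choices:

```python
import math

# Midpoint-rule check of the Gaussian integral; [-50, 50] truncates a
# negligible tail for these parameters.
def gaussian_integral(mu, sigma, lo=-50.0, hi=50.0, steps=200_000):
    """Approximate the integral of exp(-(x - mu)^2 / (2 sigma^2)) over [lo, hi]."""
    h = (hi - lo) / steps
    return h * sum(math.exp(-((lo + (i + 0.5) * h - mu) ** 2) / (2 * sigma ** 2))
                   for i in range(steps))

approx = gaussian_integral(mu=1.0, sigma=2.0)
exact = math.sqrt(2 * math.pi) * 2.0   # sqrt(2 pi) * sigma
```

The two values agree to several decimal places, matching the identity above.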
Let $X_1,\dots,X_n$ be independent with $\operatorname{Var}(X_i)=\sigma_i^2$. For $U=\sum_{i=1}^n a_iX_i$ and $V=\sum_{i=1}^n b_iX_i$, $\operatorname{Cov}(U,V)=\sum_{i=1}^n a_ib_i\sigma_i^2$. For normal $X_i$ only, $\operatorname{Corr}(U,V)=0\implies U,V$ are independent.
The sum of independent normals is still normal: if $X_i\sim N(\mu_i,\sigma_i^2)$ are independent, then $\sum_i X_i\sim N\!\left(\sum_i\mu_i,\ \sum_i\sigma_i^2\right)$.
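A Monte Carlo sketch of this closure property; the parameters $N(1,2^2)$ and $N(-3,1.5^2)$ and the sample size are arbitrary choices:

```python
import random
import statistics

random.seed(0)

# X1 ~ N(1, 2^2) and X2 ~ N(-3, 1.5^2), independent; the sum should be
# N(1 + (-3), 2^2 + 1.5^2) = N(-2, 6.25).
samples = [random.gauss(1, 2) + random.gauss(-3, 1.5) for _ in range(200_000)]
sample_mean = statistics.fmean(samples)
sample_var = statistics.pvariance(samples)
```

The empirical mean and variance land close to $-2$ and $6.25$, as the formula predicts.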
(The EXPONENTIAL DISTRIBUTION): $f(x)=\begin{cases}\lambda\exp(-\lambda x) & x\ge 0\\ 0 & x<0\end{cases}$
(The GAMMA DISTRIBUTION): $f(x)=\frac{\lambda^\alpha x^{\alpha-1}\exp(-\lambda x)}{\Gamma(\alpha)}$ for $x>0$; $\alpha,\lambda>0$.
$\Gamma(\alpha)=\int_0^\infty t^{\alpha-1}e^{-t}\,dt,\quad \alpha>0$
$\Gamma(\alpha+1)=\alpha\Gamma(\alpha)$; in particular, $\forall n\in\mathbb{N},\ \Gamma(n)=(n-1)!$.
$\Gamma(1/2)=\sqrt{\pi}$
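These three Gamma-function facts can be checked directly with the standard library's `math.gamma`; the test point $a=2.7$ is an arbitrary choice:

```python
import math

g5 = math.gamma(5)                                     # Gamma(5) = 4! = 24
g_half = math.gamma(0.5)                               # Gamma(1/2) = sqrt(pi)
a = 2.7                                                # arbitrary non-integer point
recursion_gap = math.gamma(a + 1) - a * math.gamma(a)  # should be ~0
```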
The Cumulative Distribution Function (CDF) describes events of the form $(-\infty,x]$; it is the function $F_X:\mathbb{R}\to[0,1]$ defined by $F_X(x)=P(X\le x)=\int_{-\infty}^x f(y)\,dy$.
From the definition, $P(a<X\le b)=F_X(b)-F_X(a)$; for continuous $X$ this also equals $P(a\le X\le b)$.
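A concrete instance of this CDF identity, using the Exponential distribution defined above; the rate $\lambda=0.5$ and the interval $[1,3]$ are arbitrary choices:

```python
import math

def exp_cdf(x, lam):
    """CDF of the Exponential(lam) distribution: 1 - exp(-lam x) for x >= 0."""
    return 1 - math.exp(-lam * x) if x >= 0 else 0.0

lam = 0.5
p = exp_cdf(3, lam) - exp_cdf(1, lam)      # P(1 < X <= 3) via the CDF

# Cross-check by midpoint-rule integration of the density lam*exp(-lam x) on [1, 3].
steps = 200_000
h = 2 / steps
q = h * sum(lam * math.exp(-lam * (1 + (i + 0.5) * h)) for i in range(steps))
```

Both routes give $e^{-0.5}-e^{-1.5}$, i.e. the CDF difference matches the integral of the density.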
(The BETA DISTRIBUTION): $f(x)=\frac{1}{B(\alpha,\beta)}x^{\alpha-1}(1-x)^{\beta-1}=\frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}x^{\alpha-1}(1-x)^{\beta-1}$ for $0<x<1$.
Beta function: $B(\alpha,\beta)=\int_0^1 t^{\alpha-1}(1-t)^{\beta-1}\,dt=\frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)}$, and the definition extends to more than two arguments.
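The Beta–Gamma identity can be verified numerically; the parameters $\alpha=2.5,\beta=3$ and the midpoint-rule integrator are arbitrary choices:

```python
import math

def beta_numeric(a, b, steps=200_000):
    """Midpoint-rule approximation of B(a, b) = integral_0^1 t^(a-1)(1-t)^(b-1) dt."""
    h = 1 / steps
    return h * sum(((i + 0.5) * h) ** (a - 1) * (1 - (i + 0.5) * h) ** (b - 1)
                   for i in range(steps))

a, b = 2.5, 3.0
numeric = beta_numeric(a, b)
via_gamma = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
```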
(The BIVARIATE NORMAL DISTRIBUTION): Let $X,Y$ be normal random variables with means $\mu_1,\mu_2$ and variances $\sigma_1^2,\sigma_2^2$ respectively, and let $\rho$ be their correlation, $-1\le\rho\le 1$. Their joint density is $f_{X,Y}(x,y)=\frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}\exp\!\left(-\frac{1}{2(1-\rho^2)}\left[\left(\frac{x-\mu_1}{\sigma_1}\right)^2+\left(\frac{y-\mu_2}{\sigma_2}\right)^2-2\rho\left(\frac{x-\mu_1}{\sigma_1}\right)\left(\frac{y-\mu_2}{\sigma_2}\right)\right]\right)$
Let $X$ and $Y$ be jointly absolutely continuous random variables with joint density $f_{X,Y}(x,y)$. The conditional density of $Y$ given $X=x$ is $f_{Y\mid X}(y\mid x)=\frac{f_{X,Y}(x,y)}{f_X(x)}$.
A sampling distribution is the probability distribution of a sample statistic. More formally, let $Y=h(X_1,\dots,X_n)$ be any function of the sample; the probability distribution of $Y$ is called a sampling distribution.
Standard error of the sample mean of a normal sample: $\operatorname{SD}(\bar X)=\sigma/\sqrt{n}$.
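A simulation sketch of the $\sigma/\sqrt{n}$ rule; the population parameters $\mu=10,\sigma=3$, sample size $n=25$, and replication count are arbitrary choices:

```python
import random
import statistics

random.seed(1)
mu, sigma, n = 10.0, 3.0, 25

# Spread of many sample means should match sigma / sqrt(n) = 0.6.
means = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(20_000)]
observed_se = statistics.pstdev(means)
theoretical_se = sigma / n ** 0.5
```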
(Chi-Square Distribution): Let $Z_1,\dots,Z_k\sim N(0,1)$ be independent and $Y=\sum_{i=1}^k Z_i^2$; then $Y\sim\chi^2(k)$.
$k$ is the degrees of freedom. Equivalently, $Y\sim\operatorname{Gamma}(k/2,\,1/2)$ (shape $k/2$, rate $1/2$).
Let $\bar X$ be the sample mean and $s^2=\frac{\sum_{i=1}^n(X_i-\bar X)^2}{n-1}=\mathrm{TSS}/(n-1)$ the sample variance, where $X_i\overset{\text{i.i.d.}}{\sim}N(\mu,\sigma^2)$. Then $(n-1)s^2/\sigma^2\sim\chi^2(n-1)$. Furthermore, the sample mean and the sample variance are independent.
$E(Y)=k,\ \operatorname{Var}(Y)=2k$ for $Y\sim\chi^2(k)$.
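A Monte Carlo check of the chi-square moments via the sum-of-squared-normals construction; $k=6$ and the sample size are arbitrary choices:

```python
import random
import statistics

random.seed(2)
k = 6

# Y = sum of k squared independent standard normals, so Y ~ chi^2(k);
# the empirical mean and variance should be close to k and 2k.
ys = [sum(random.gauss(0, 1) ** 2 for _ in range(k)) for _ in range(100_000)]
mean_y = statistics.fmean(ys)
var_y = statistics.pvariance(ys)
```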
(T-Distribution): Let $X,X_1,\dots,X_n\overset{\text{i.i.d.}}{\sim}N(0,1)$ and $Y=\sum_{i=1}^n X_i^2$, so that $Y\sim\chi^2(n)$. We define $X/\sqrt{Y/n}\sim t_n$. Let $U=X/\sqrt{Y/n}$; then the PDF is $f_U(u)=\frac{\Gamma((n+1)/2)}{\sqrt{n\pi}\,\Gamma(n/2)}\left(1+u^2/n\right)^{-(n+1)/2}$
$t_n\xrightarrow{d} N(0,1)$ as $n\to\infty$
$n=1\implies$ Cauchy distribution
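Both limiting facts can be seen by evaluating the $t$ PDF above at $u=0$: at $n=1$ it equals the Cauchy density $1/\pi$, and for large $n$ it approaches the standard normal density $1/\sqrt{2\pi}$. The evaluation points $n=1$ and $n=200$ are arbitrary choices:

```python
import math

def t_pdf(u, n):
    """t-distribution PDF with n degrees of freedom."""
    return (math.gamma((n + 1) / 2) / (math.sqrt(n * math.pi) * math.gamma(n / 2))
            * (1 + u * u / n) ** (-(n + 1) / 2))

normal_pdf_0 = 1 / math.sqrt(2 * math.pi)
cauchy_gap = abs(t_pdf(0, 1) - normal_pdf_0)    # n = 1: Cauchy, noticeably off
limit_gap = abs(t_pdf(0, 200) - normal_pdf_0)   # large n: close to N(0, 1)
```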
(F-Distribution): Let $X_1,\dots,X_m\overset{\text{i.i.d.}}{\sim}N(0,1)$ and $Y_1,\dots,Y_n\overset{\text{i.i.d.}}{\sim}N(0,1)$, and set $Z_x=\sum_{i=1}^m X_i^2$, $Z_y=\sum_{i=1}^n Y_i^2$. We define $\frac{Z_x/m}{Z_y/n}\sim F(m,n)$. Let $U=\frac{Z_x/m}{Z_y/n}$; then the PDF is $f_U(u)=\frac{\Gamma((m+n)/2)}{\Gamma(m/2)\Gamma(n/2)}\left(\frac{m}{n}\right)\left(\frac{mu}{n}\right)^{m/2-1}\left(1+\frac{mu}{n}\right)^{-(m+n)/2}$ for $u>0$
$U\sim F(m,n)\implies 1/U\sim F(n,m)$
$U_n\sim F(m,n)\implies mU_n\xrightarrow{d}\chi^2(m)$ as $n\to\infty$
$U\sim t_k\implies U^2\sim F(1,k)$
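The reciprocal identity $1/U\sim F(n,m)$ can be verified deterministically from the PDF above: the change of variables $V=1/U$ gives $f_V(v)=f_U(1/v)/v^2$, which should coincide with the $F(n,m)$ density. The degrees of freedom $m=5,n=8$ and evaluation point $u=1.7$ are arbitrary choices:

```python
import math

def f_pdf(u, m, n):
    """F(m, n) PDF for u > 0 (same formula as above, with (m/n)^(m/2) pulled out)."""
    c = math.gamma((m + n) / 2) / (math.gamma(m / 2) * math.gamma(n / 2))
    return c * (m / n) ** (m / 2) * u ** (m / 2 - 1) * (1 + m * u / n) ** (-(m + n) / 2)

m, n, u = 5, 8, 1.7

# Change of variables V = 1/U: f_V(v) = f_U(1/v) / v^2, evaluated at v = u,
# compared against the F(n, m) density at u.
lhs = f_pdf(1 / u, m, n) / u ** 2
rhs = f_pdf(u, n, m)
```

The two densities agree to machine precision, as the identity demands.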
Exercise: prove all of these sampling distributions using change of variables.
Let $X_1,X_2,\dots$ be an i.i.d. sequence of random variables, and let $N$ be a non-negative integer-valued random variable independent of $\{X_i\}$. Then $S=\sum_{i=1}^{N}X_i$ is a random variable with a compound distribution.
$E[S]=E[N]\,E[X_1]$
Proof: $E[S]=E\!\left[\sum_{i=1}^{N}X_i\right]$ where $E[X_i]=E[X_1]<\infty$. Define the indicator $I_i=\mathbf{1}\{i\le N\}$, so that $S=\sum_{i=1}^\infty I_iX_i$.
Then $E[S]=E\!\left[\sum_{i=1}^\infty I_iX_i\right]=\sum_{i=1}^\infty E[I_i]\,E[X_i]=E[X_1]\sum_{i=1}^\infty E[I_i]=E[X_1]\,E\!\left[\sum_{i=1}^\infty I_i\right]=E[N]\,E[X_1]$, using that $N$ (hence each $I_i$) is independent of the $X_i$ and that $\sum_{i=1}^\infty I_i=N$.
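A simulation sketch of this identity (often called Wald's identity); the choices of $N$ uniform on $\{0,\dots,10\}$ and $X_i\sim\operatorname{Exponential}(\text{rate }2)$ are arbitrary:

```python
import random
import statistics

random.seed(3)

# N uniform on {0, ..., 10} (E[N] = 5) and X_i ~ Exponential(rate 2) (E[X] = 0.5),
# with N drawn independently of the X_i.
def compound_sample():
    n = random.randint(0, 10)
    return sum(random.expovariate(2.0) for _ in range(n))

s_mean = statistics.fmean(compound_sample() for _ in range(200_000))
# E[S] should be E[N] * E[X_1] = 5 * 0.5 = 2.5
```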
$m_S(t)=r_N\!\left(m_{X_1}(t)\right)$, where $r_N(s)=E[s^N]$ is the probability generating function of $N$.
$m_S(t)=E[e^{tS}]=E\!\left[\exp\!\left(t\sum_{i=1}^{N}X_i\right)\right]=E\!\left[E\!\left[\exp\!\left(t\sum_{i=1}^{N}X_i\right)\,\Big|\, N\right]\right]=\sum_{j=0}^\infty P(N=j)\,E\!\left[\exp\!\left(t\sum_{i=1}^{j}X_i\right)\right]=\sum_{j=0}^\infty P(N=j)\,[m_{X_1}(t)]^j=r_N\!\left(m_{X_1}(t)\right)$, where $r_N(s)=\sum_{j=0}^\infty s^j\,P(N=j)$.
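This composition can be checked exactly for a small concrete case: take $X_i\sim\operatorname{Bernoulli}(p)$ (so $S$ given $N=j$ is $\operatorname{Binomial}(j,p)$) and enumerate $E[e^{tS}]$ directly, comparing against $r_N(m_{X_1}(t))$. The values $p=0.3$, $t=0.7$, and $N$ uniform on $\{0,1,2\}$ are arbitrary choices:

```python
import math

p, t = 0.3, 0.7
pN = {0: 1 / 3, 1: 1 / 3, 2: 1 / 3}   # N uniform on {0, 1, 2}

# Left side: E[e^{tS}] from the exact distribution of S.
# Given N = j, S ~ Binomial(j, p), so enumerate its support directly.
lhs = sum(pj * sum(math.comb(j, k) * p ** k * (1 - p) ** (j - k) * math.exp(t * k)
                   for k in range(j + 1))
          for j, pj in pN.items())

# Right side: r_N(m_{X_1}(t)) with m_{X_1}(t) = 1 - p + p*e^t (Bernoulli mgf)
# and r_N(s) = sum_j s^j P(N = j).
mX = 1 - p + p * math.exp(t)
rhs = sum(pj * mX ** j for j, pj in pN.items())
```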
Let $X_1,\dots,X_n$ be random variables with CDFs $F_1,\dots,F_n$, and let $p_1,\dots,p_n$ be positive real numbers with $\sum_{i=1}^n p_i=1$. Then $G(x)=p_1F_1(x)+\cdots+p_nF_n(x)$ is the CDF of the mixture distribution.
Is the random variable continuous or discrete?
e.g. 2.5.6 from E&R: Let $X_1\sim\operatorname{Poisson}(3)$ with CDF $F_1$ and $X_2\sim N(0,1)$ with CDF $F_2$, where $p_1=1/5$, $p_2=4/5$. Let $Y$ be the random variable with the mixture distribution, i.e. with CDF $G(x)=\frac{1}{5}F_1(x)+\frac{4}{5}F_2(x)$.
If $Y$ were continuous, then $P(Y=y)=0$ for all $y$. But $P(Y=y)=F_Y(y)-F_Y(y^-)=\frac{1}{5}\big(F_1(y)-F_1(y^-)\big)+\frac{4}{5}\big(F_2(y)-F_2(y^-)\big)=\frac{1}{5}P(X_1=y)+0$, since $X_2$ is continuous.
$P(X_1=y)=\frac{3^ye^{-3}}{y!}$ for $y\in\{0,1,2,\dots\}$, so $P(Y=y)=\begin{cases}\frac{1}{5}\frac{3^ye^{-3}}{y!} & y\in\{0,1,2,\dots\}\\ 0 & \text{otherwise}\end{cases}$, which is positive at the non-negative integers, so $Y$ is not continuous.
If $Y$ were discrete, then $\sum_{y=0}^\infty P(Y=y)=\sum_{y=0}^\infty\frac{1}{5}\frac{3^ye^{-3}}{y!}=\frac{1}{5}\ne 1$, so $Y$ is not discrete either.
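The $1/5$ total discrete mass in this example can be confirmed numerically (the truncation point 60 is an arbitrary cutoff where the Poisson tail is negligible):

```python
import math

# Total discrete mass of Y: (1/5) * sum of the Poisson(3) pmf over y = 0, 1, 2, ...
discrete_mass = sum(0.2 * 3 ** y * math.exp(-3) / math.factorial(y)
                    for y in range(60))
# The remaining 4/5 of the mass is spread continuously by the N(0, 1) component,
# so neither the discrete nor the continuous part alone carries probability 1.
```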